# Spanish RoBERTa
## RuPERTa-base
RuPERTa is a case-insensitive RoBERTa model trained on a large Spanish corpus with RoBERTa's improved pre-training procedure, making it suitable for a variety of Spanish NLP tasks.
Tags: Large Language Model, Spanish
Author: mrm8488
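As a minimal sketch, the model can be queried with the transformers fill-mask pipeline; the checkpoint id `mrm8488/RuPERTa-base` is an assumption inferred from the author and model names listed above.

```python
# Sketch: masked-token prediction with RuPERTa via the transformers library.
# The model id "mrm8488/RuPERTa-base" is assumed from the listing above.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="mrm8488/RuPERTa-base")

# RoBERTa-style tokenizers use "<mask>" as the mask token.
for pred in fill_mask("España es un <mask> de Europa."):
    print(f"{pred['token_str']!r}: {pred['score']:.3f}")
```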
## BERTIN RoBERTa-base Spanish
BERTIN is a series of Spanish BERT-based models. The current model is a RoBERTa-base model trained from scratch on a portion of the Spanish mC4 dataset using Flax.
Tags: Large Language Model, Spanish
Author: bertin-project
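A sketch of loading the checkpoint directly for masked-LM inference with AutoTokenizer and AutoModelForMaskedLM; the checkpoint id `bertin-project/bertin-roberta-base-spanish` is an assumption inferred from the project name listed above.

```python
# Sketch: masked-LM inference with the BERTIN checkpoint.
# The model id "bertin-project/bertin-roberta-base-spanish" is assumed from the listing above.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "bertin-project/bertin-roberta-base-spanish"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

text = f"El clima de Madrid es muy {tokenizer.mask_token} en verano."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Pick the highest-scoring token at the masked position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```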